the Wild West
noun
: the western United States in the past when there were many cowboys, outlaws, etc.
stories about the Wild West
—often used before another noun
Wild West stories
a Wild West show